Convex Co-embedding
Authors
Abstract
We present a general framework for association learning, in which entities are embedded in a common latent space so that relatedness is expressed through geometry. This approach underlies the state of the art in link prediction, relation learning, multi-label tagging, relevance retrieval and ranking. Although current approaches rely on local training methods applied to non-convex formulations, we demonstrate how general convex formulations can be achieved for entity embedding, both for standard multi-linear models and for prototype-distance models. We also investigate an efficient optimization strategy that allows training to scale. An experimental evaluation demonstrates the advantages of global training across several case studies.
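To make the idea concrete, here is a minimal sketch (not the authors' algorithm; all names are hypothetical) of how a factored multi-linear model, which scores the association between entities i and j as u_i^T v_j, can be given a convex surrogate: the product U V^T is replaced by a single score matrix Z penalized by the nuclear norm, and the resulting problem is trained globally with proximal gradient steps.

```python
import numpy as np

def svt(Z, tau):
    # Singular value thresholding: the proximal operator of tau * (nuclear norm).
    U, s, Vt = np.linalg.svd(Z, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def convex_coembed(A, mask, lam=0.1, step=0.5, iters=300):
    # Solve the convex surrogate
    #     min_Z  0.5 * || mask * (Z - A) ||_F^2  +  lam * ||Z||_*
    # where Z plays the role of U V^T in a factored embedding model.
    Z = np.zeros_like(A, dtype=float)
    for _ in range(iters):
        grad = mask * (Z - A)                  # gradient of the smooth data-fit term
        Z = svt(Z - step * grad, step * lam)   # proximal step on the nuclear norm
    return Z

# Toy usage: rows are one entity set, columns another; 40% of associations observed.
rng = np.random.default_rng(0)
A_full = rng.normal(size=(20, 3)) @ rng.normal(size=(3, 15))
mask = (rng.random(A_full.shape) < 0.4).astype(float)
Z_hat = convex_coembed(A_full * mask, mask)
```

Per-entity embedding vectors can then be read off from a truncated SVD of Z_hat, one standard way to recover a factored representation from the convex solution.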
Similar resources
A generalized form of the Hermite-Hadamard-Fejer type inequalities involving fractional integral for co-ordinated convex functions
Recently, a general class of Hermite-Hadamard-Fejér inequalities for convex functions was studied in [H. Budak, Results in Mathematics, 74:29, March 2019]. In this paper, we establish a generalization of the Hermite-Hadamard-Fejér inequality for fractional integrals based on co-ordinated convex functions. Our results generalize and improve several inequalities obtained in earlier studies.
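For context, the classical inequalities being generalized read as follows for a convex function f on [a, b]; the Fejér (weighted) version additionally assumes g : [a, b] → [0, ∞) is integrable and symmetric about (a + b)/2:

\[
f\!\left(\tfrac{a+b}{2}\right) \;\le\; \frac{1}{b-a}\int_a^b f(x)\,dx \;\le\; \frac{f(a)+f(b)}{2},
\]
\[
f\!\left(\tfrac{a+b}{2}\right)\int_a^b g(x)\,dx \;\le\; \int_a^b f(x)\,g(x)\,dx \;\le\; \frac{f(a)+f(b)}{2}\int_a^b g(x)\,dx .
\]

The fractional and co-ordinated versions typically replace the integrals with Riemann-Liouville fractional integrals and work with functions on a rectangle that are convex in each coordinate separately.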
Scalable Metric Learning for Co-Embedding
We present a general formulation of metric learning for co-embedding, where the goal is to relate objects from different sets. The framework allows metric learning to be applied to a wide range of problems, including link prediction, relation learning, multi-label tagging and ranking, while permitting training to be reformulated as convex optimization. For training we provide a fast iterative algorithm...
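To make the metric-learning view concrete, a minimal sketch follows (assumed notation, not the paper's exact formulation): objects from the two sets are represented by feature vectors in a common space, relatedness is scored with a shared Mahalanobis-style form that is linear in the metric matrix M, and constraining M to the positive semidefinite cone keeps the learning problem convex.

```python
import numpy as np

def relatedness_scores(X, Y, M):
    # s(x, y) = -(x - y)^T M (x - y): higher score means more related.
    # The score is linear in M, so convex losses built on it remain convex in M.
    diffs = X[:, None, :] - Y[None, :, :]               # shape (n_x, n_y, d)
    return -np.einsum('ijd,de,ije->ij', diffs, M, diffs)

def project_psd(M):
    # Euclidean projection onto the PSD cone, the convex feasible set for a metric.
    w, V = np.linalg.eigh((M + M.T) / 2.0)
    return (V * np.clip(w, 0.0, None)) @ V.T
```

A projected (sub)gradient loop that alternates loss gradients on M with project_psd is one simple way to train such a model at scale.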
Convex Co-Embedding for Matrix Completion with Predictive Side Information
Matrix completion, a common problem in many application domains, has received increasing attention in the machine learning community. Previous matrix completion methods have mostly focused on exploiting the low-rank property of the matrix to recover missing entries. Recently, it has been noticed that side information describing the matrix items can help to improve matrix completion performance...
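One common way such side information enters a convex completion objective (a generic formulation, not necessarily the one used in the cited paper) is to parameterize the score of entry (i, j) through known row and column feature vectors f_i and g_j:

\[
\min_{W}\;\; \frac{1}{2}\sum_{(i,j)\in\Omega}\bigl(f_i^{\top} W g_j - A_{ij}\bigr)^2 \;+\; \lambda\,\lVert W\rVert_{*},
\]

where Ω indexes the observed entries and the nuclear norm \(\lVert W\rVert_{*}\) keeps the problem convex while encouraging a low-rank coupling between the two feature spaces.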
Join-semidistributive Lattices of Relatively Convex Sets
We give two sufficient conditions for the lattice Co(R^n, X) of relatively convex sets of R^n to be join-semidistributive, where X is a finite union of segments. We also prove that every finite lower bounded lattice can be embedded into Co(R^n, X), for a suitable finite subset X of R^n.
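For reference, a lattice L is join-semidistributive when it satisfies, for all x, y, z in L,

\[
x \vee y = x \vee z \;\Longrightarrow\; x \vee y = x \vee (y \wedge z).
\]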
Embedding Heterogeneous Data Using Statistical Models
Embedding algorithms are a method for revealing low dimensional structure in complex data. Most embedding algorithms are designed to handle objects of a single type for which pairwise distances are specified. Here we describe a method for embedding objects of different types (such as authors and terms) into a single common Euclidean space based on their co-occurrence statistics. The joint distr...
Euclidean Embedding of Co-Occurrence Data
Embedding algorithms search for low dimensional structure in complex data, but most algorithms only handle objects of a single type for which pairwise distances are specified. This paper describes a method for embedding objects of different types, such as images and text, into a single common Euclidean space based on their co-occurrence statistics. The joint distributions are modeled as exponen...
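Models in this family typically tie the co-occurrence probability of a pair to the distance between their embeddings, roughly of the form (exact variants differ between the two papers above)

\[
p(x, y) \;\propto\; \bar{p}(x)\,\bar{p}(y)\,\exp\!\bigl(-\lVert \phi(x) - \psi(y)\rVert^{2}\bigr),
\]

where φ and ψ map the two object types into the shared Euclidean space, so that frequently co-occurring objects are pulled close together.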
Publication date: 2014